Assessing the validity of an IAU General English Achievement Test through hybridizing differential item functioning and differential distractor functioning


Abstract

The current study sought to examine the validity of a General English Achievement Test (GEAT), administered to university students in the fall semester of the 2018–2019 academic year, by hybridizing differential item functioning (DIF) and differential distractor functioning (DDF) analytical models. Using a purposive sampling method, a sample of 835 GEAT test-takers was selected from the target population of undergraduates studying different disciplines at Islamic Azad University (IAU), Isfahan branch. The 60-item multiple-choice test comprised four sub-sections, namely vocabulary, grammar, cloze test, and reading comprehension. The students' scores served as the target data and were examined through the Cochran-Mantel-Haenszel (CMH) and multinomial log-linear regression models for detecting DIF and DDF, respectively. To satisfy the assumption of unidimensionality, the sub-sections were analyzed independently. Furthermore, local independence was checked through correlational analysis, and no extreme values were observed. The results identified five moderate-level DIF items and one DDF item, signaling an adverse effect on fairness due to the presence of biased items. Notably, these findings may have important implications for both language policymakers and test developers.
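The abstract does not include the authors' analysis code; the sketch below only illustrates, under stated assumptions, how a Mantel-Haenszel DIF screen of the kind described (score-matched 2x2xK tables per item, with the ETS delta-MH scale commonly used to label moderate DIF) could be run in Python with statsmodels. The response matrix, grouping variable, number of score bands, and the function name mh_dif are hypothetical, not taken from the study.

import numpy as np
from statsmodels.stats.contingency_tables import StratifiedTable

def mh_dif(responses, group, item, n_bands=5):
    """CMH DIF screen for one dichotomously scored item.

    responses : (examinees x items) array of 0/1 scores for one sub-section
    group     : binary array, 0 = reference group, 1 = focal group
    """
    total = responses.sum(axis=1)                      # matching criterion: raw sub-section score
    cuts = np.quantile(total, np.linspace(0, 1, n_bands + 1)[1:-1])
    bands = np.digitize(total, cuts)                   # score bands used as strata
    tables = []
    for b in np.unique(bands):
        in_band = bands == b
        # 2 x 2 stratum table: rows = reference/focal, columns = correct/incorrect
        tab = np.array([
            [np.sum((group == 0) & in_band & (responses[:, item] == 1)),
             np.sum((group == 0) & in_band & (responses[:, item] == 0))],
            [np.sum((group == 1) & in_band & (responses[:, item] == 1)),
             np.sum((group == 1) & in_band & (responses[:, item] == 0))],
        ])
        if (tab.sum(axis=1) > 0).all():                # keep strata where both groups are present
            tables.append(tab)
    strat = StratifiedTable(tables)
    cmh = strat.test_null_odds(correction=True)        # Mantel-Haenszel chi-square test
    delta_mh = -2.35 * np.log(strat.oddsratio_pooled)  # ETS delta-MH scale
    return cmh.statistic, cmh.pvalue, delta_mh

On the ETS scale, an item with a significant CMH statistic and |delta-MH| between roughly 1.0 and 1.5 is conventionally labelled moderate (B-level) DIF. The DDF side described in the abstract would instead model the nominal distractor choices, for example with a multinomial regression on matched ability and group membership; that step is not sketched here.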


Similar articles

An Exploratory Study of Differential Item Functioning (DIF) in EFL Reading Comprehension

Investigating the sources of differential item functioning in a foreign-language reading comprehension test: the research literature on the sources of differential item functioning (DIF) in reading comprehension tests is replete with hypothesized variables, among the most important of which are gender, familiarity with the text topic, interest in the topic or content of the text, guessing, and socio-contextual factors (Pae 2004; Zumbo and Gelin 2005). Drawing on Popper's philosophy of falsifiability, the present study examines the aforementioned factors...


Assessing Differential Item Functioning on the Test of Relational Reasoning

The test of relational reasoning (TORR) is designed to assess the ability to identify complex patterns within visuospatial stimuli. The TORR is designed for use in school and university settings, and therefore, its measurement invariance across diverse groups is critical. In this investigation, a large sample, representative of a major university on key demographic variables, was collected, and...


Gender Differential Item Functioning Analysis of the University of Tehran English Proficiency Test

The University of Tehran English Proficiency Test (UTEPT) is a high-stakes entrance examination taken by more than 10,000 master's degree holders annually. The examinees' scores have a significant influence on the final decisions concerning admission to the University of Tehran Ph.D. programs. As a test validation investigation, the present study, which is bias detection research in nature, u...


Differential Item Functioning and Unidimensionality in the Pearson Test of English Academic

Since the Pearson Test of English Academic (PTE Academic) was designed to assess skill differences among test-takers at all points along the ability continuum, rather than to determine cutoff scores, it is important to examine the extent to which the instrument assesses what it is intended to measure (validity) as well as the extent to which the test is consistent (reliability) in measuring ELL...


Measuring differential item and test functioning across academic disciplines

Differential item functioning (DIF) occurs when a test item favors or hinders members of a sub-group of the test-taking population who share a characteristic. DIF analyses are statistical procedures used to determine the extent to which the content of an item affects item endorsement across sub-groups of test-takers. If DIF is found for many items on the test, the final test scores do not represent the same mea...



Journal

Journal title: Language Testing in Asia

Year: 2021

ISSN: 2229-0443

DOI: https://doi.org/10.1186/s40468-021-00124-7